Cheetah Experimental Platform Web 1.0: Cleaning Pupillary Data
Recently, researchers have begun using cognitive load in various settings, e.g.,
educational psychology, cognitive load theory, or human-computer interaction.
Cognitive load characterizes a task's demand on the limited information
processing capacity of the brain. The widespread adoption of eye-tracking
devices has drawn increased attention to objectively measuring cognitive load via
pupil dilation. However, this approach requires a standardized data processing
routine to reliably measure cognitive load. This technical report presents
CEP-Web, an open-source platform providing state-of-the-art data processing
routines for cleaning pupillary data combined with a graphical user interface,
enabling the management of studies and subjects. Future developments will
include support for analyzing the cleaned data as well as support for
Task-Evoked Pupillary Response (TEPR) studies.
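A typical first step in cleaning pupillary data is removing blink artifacts, which eye trackers often record as zeros or physiologically implausible diameters, and interpolating over the resulting gaps. The sketch below illustrates this common approach in Python; it is a minimal, hypothetical example and not the actual CEP-Web routine (the function name and validity thresholds are assumptions).

```python
import numpy as np

def clean_pupil_trace(samples, min_valid=2.0, max_valid=8.0):
    """Clean a raw pupil-diameter trace (in mm).

    Samples outside [min_valid, max_valid] (e.g. blinks recorded
    as 0) are treated as missing and linearly interpolated from
    the surrounding valid samples. Illustrative sketch only --
    not the routine implemented in CEP-Web.
    """
    trace = np.asarray(samples, dtype=float)
    valid = (trace >= min_valid) & (trace <= max_valid)
    if not valid.any():
        raise ValueError("no valid samples in trace")
    idx = np.arange(trace.size)
    # np.interp fills invalid positions by linear interpolation;
    # samples at the edges are held at the nearest valid value.
    return np.interp(idx, idx[valid], trace[valid])
```

For example, a trace with two blink samples, `[3.0, 3.2, 0.0, 0.0, 3.6, 3.4]`, is repaired by interpolating linearly between the valid neighbors 3.2 and 3.6.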
Detection and quantification of flow consistency in business process models
Business process models abstract complex business processes by representing them as graphical models. Their layout, as determined by the modeler, may have an effect when these models are used. However, this effect is currently not fully understood. In order to systematically study this effect, a basic set of measurable key visual features is proposed, depicting the layout properties that are meaningful to the human user. The aim of this research is thus twofold: first, to empirically identify key visual features of business process models which are perceived as meaningful to the user and, second, to show how such features can be quantified into computational metrics, which are applicable to business process models. We focus on one particular feature, consistency of flow direction, and show the challenges that arise when transforming it into a precise metric. We propose three different metrics addressing these challenges, each following a different view of flow consistency. We then report the results of an empirical evaluation, which indicates which metric is more effective in predicting the human perception of this feature. Moreover, two other automatic evaluations describing the performance and the computational capabilities of our metrics are reported as well.
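To give a concrete sense of how a flow-consistency feature can be turned into a computational metric, the sketch below scores a model layout by the share of edges aligned with the dominant flow direction. This is a hypothetical illustration under simplifying assumptions (edges given as coordinate pairs, four compass bins); it is not one of the three metrics proposed in the paper.

```python
from collections import Counter

def flow_consistency(edges):
    """Fraction of edges aligned with the dominant flow direction.

    `edges` is a list of ((x1, y1), (x2, y2)) pairs taken from the
    model layout. Each edge is binned into one of four compass
    directions by its larger displacement component; the score is
    the share of edges in the most common bin. A hypothetical
    metric for illustration, not one of the paper's three metrics.
    """
    def direction(src, dst):
        dx, dy = dst[0] - src[0], dst[1] - src[1]
        if abs(dx) >= abs(dy):
            return "right" if dx >= 0 else "left"
        return "down" if dy >= 0 else "up"

    if not edges:
        return 1.0  # an empty model is trivially consistent
    counts = Counter(direction(s, d) for s, d in edges)
    return counts.most_common(1)[0][1] / len(edges)
```

A layout in which three of four edges run left to right while one runs downward would score 0.75 under this toy metric; a strictly left-to-right layout scores 1.0.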
The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers
A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from ordering a book online until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers, and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.